In probability theory, it is possible to approximate the moments of a function ''f'' of a random variable ''X'' using Taylor expansions, provided that ''f'' is sufficiently differentiable and that the moments of ''X'' are finite. This technique is often used by statisticians.

==First moment==

:<math>\operatorname{E}\left[f(X)\right] \approx \operatorname{E}\left[f(\mu_X) + f'(\mu_X)\left(X - \mu_X\right) + \frac{1}{2}f''(\mu_X)\left(X - \mu_X\right)^2\right] = f(\mu_X) + f'(\mu_X)\operatorname{E}\left[X - \mu_X\right] + \frac{1}{2}f''(\mu_X)\operatorname{E}\left[\left(X - \mu_X\right)^2\right].</math>

Notice that <math>\operatorname{E}\left[X - \mu_X\right] = 0</math>, so the second term disappears. Also, <math>\operatorname{E}\left[\left(X - \mu_X\right)^2\right]</math> is <math>\sigma_X^2</math>. Therefore,

:<math>\operatorname{E}\left[f(X)\right] \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\sigma_X^2,</math>

where <math>\mu_X</math> and <math>\sigma_X^2</math> are the mean and variance of ''X'' respectively.〔Haym Benaroya, Seon Mi Han, and Mark Nagurka. ''Probability Models in Engineering and Science''. CRC Press, 2005.〕

It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example,

:<math>\operatorname{E}\left[\frac{X}{Y}\right] \approx \frac{\operatorname{E}\left[X\right]}{\operatorname{E}\left[Y\right]} - \frac{\operatorname{Cov}\left[X,Y\right]}{\operatorname{E}\left[Y\right]^2} + \frac{\operatorname{E}\left[X\right]}{\operatorname{E}\left[Y\right]^3}\operatorname{Var}\left[Y\right].</math>
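The quality of the second-order approximation to the first moment can be checked numerically. The following sketch (in Python with NumPy; the choices <code>f(x) = exp(x)</code> and a normally distributed ''X'' are illustrative assumptions, not part of the result above) compares the Taylor approximation <math>f(\mu_X) + \tfrac{1}{2}f''(\mu_X)\sigma_X^2</math> with a Monte Carlo estimate of <math>\operatorname{E}\left[f(X)\right]</math>.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative check of the second-order approximation
#   E[f(X)] ~ f(mu) + f''(mu) * sigma**2 / 2
# using f(x) = exp(x) and X ~ Normal(mu, sigma**2) as example choices
# (these choices are assumptions made for the demonstration only).

rng = np.random.default_rng(0)

mu, sigma = 0.5, 0.2          # mean and standard deviation of X
f = np.exp                    # f(x) = e^x, so f''(x) = e^x as well

# Second-order Taylor approximation of the first moment
approx = f(mu) + f(mu) * sigma**2 / 2

# Monte Carlo estimate of E[f(X)] for comparison
samples = rng.normal(mu, sigma, size=1_000_000)
monte_carlo = f(samples).mean()

# For this particular f and X the exact value is known (lognormal mean):
#   E[e^X] = exp(mu + sigma^2 / 2)
exact = np.exp(mu + sigma**2 / 2)

print(f"Taylor approximation: {approx:.6f}")
print(f"Monte Carlo estimate: {monte_carlo:.6f}")
print(f"Exact value:          {exact:.6f}")
</syntaxhighlight>

For small <math>\sigma_X</math> the three values agree closely; as <math>\sigma_X</math> grows, the neglected higher-order terms of the expansion become noticeable.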